Decentralized Asynchronous Non-convex Stochastic Optimization on Directed Graphs

Authors

Abstract

We consider a decentralized stochastic optimization problem over a network of agents, modeled as a directed graph: the agents aim to asynchronously minimize the average of their individual losses (possibly non-convex), each one having access only to a noisy estimate of the gradient of its own function. We propose an asynchronous distributed algorithm for this class of problems. The algorithm combines stochastic gradients with gradient tracking in a push-sum framework and obtains a sublinear convergence rate, matching the rate of centralized stochastic gradient descent applied to non-convex minimization. Our experiments on a non-convex image classification task using a convolutional neural network validate the proposed algorithm across different numbers of nodes and graph connectivity percentages.
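The abstract names two ingredients: push-sum (column-stochastic) mixing over the directed graph and tracking of the average gradient. The snippet below is a minimal synchronous sketch of that combination, in the style of push-sum gradient-tracking recursions with noisy gradients; it is not the paper's asynchronous algorithm, and the quadratic local losses, the directed ring, the step size, and the noise level are all illustrative assumptions.

```python
# Minimal synchronous sketch of push-sum mixing + gradient tracking on a
# directed graph with noisy gradients. This only illustrates the mechanism
# the abstract names; the paper's algorithm is asynchronous.
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 3                      # number of agents, problem dimension
alpha, sigma = 0.05, 0.01        # step size, gradient-noise level

# Toy local losses f_i(x) = 0.5 * ||x - B_i||^2 (stand-ins for the agents'
# losses; the global minimizer is the mean of the B_i).
B = rng.normal(size=(n, d))

def noisy_grad(i, x):
    """Noisy estimate of grad f_i at x."""
    return (x - B[i]) + sigma * rng.normal(size=d)

# Column-stochastic mixing matrix for a directed ring: agent j pushes half of
# its mass to itself and half to agent (j+1) mod n, so columns sum to 1.
C = np.zeros((n, n))
for j in range(n):
    C[j, j] = 0.5
    C[(j + 1) % n, j] = 0.5

u   = np.zeros((n, d))           # push-sum numerators
phi = np.ones(n)                 # push-sum weights
z   = u / phi[:, None]           # de-biased local iterates
y   = np.array([noisy_grad(i, z[i]) for i in range(n)])  # gradient trackers
g_old = y.copy()

for k in range(500):
    u   = C @ (u - alpha * y)    # push-sum mixing of the local descent steps
    phi = C @ phi                # push-sum weight update
    z   = u / phi[:, None]       # de-biased estimates of the average iterate
    g_new = np.array([noisy_grad(i, z[i]) for i in range(n)])
    y   = C @ y + g_new - g_old  # tracking of the network-average gradient
    g_old = g_new

print("consensus error:", np.linalg.norm(z - z.mean(0)))
print("distance to optimum:", np.linalg.norm(z.mean(0) - B.mean(0)))
```

The ratio u / phi removes the bias that the column-stochastic (non-doubly-stochastic) mixing would otherwise introduce, which is what lets this kind of scheme run on a directed graph.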

Similar Articles

Asynchronous stochastic convex optimization

We show that asymptotically, completely asynchronous stochastic gradient procedures achieve optimal (even to constant factors) convergence rates for the solution of convex optimization problems under nearly the same conditions required for asymptotic optimality of standard stochastic gradient procedures. Roughly, the noise inherent to the stochastic approximation scheme dominates any noise from...
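As a point of reference for the claim above, the following is a minimal sketch (not the paper's construction or analysis) of a completely asynchronous stochastic gradient procedure: several threads update a shared parameter vector in place, with no locks and possibly stale reads, on a least-squares toy problem. All names and constants are illustrative assumptions.

```python
# Lock-free asynchronous SGD sketch on a least-squares toy problem. With
# CPython's GIL the threads interleave rather than run truly in parallel,
# which is still enough to illustrate unsynchronised, possibly stale updates.
import threading
import numpy as np

d, n_samples, alpha = 5, 2000, 1e-3
rng = np.random.default_rng(1)
A = rng.normal(size=(n_samples, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n_samples)

x = np.zeros(d)                              # shared iterate, no locking

def worker(seed, n_steps):
    local_rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        i = local_rng.integers(n_samples)    # draw one random sample
        g = (A[i] @ x - b[i]) * A[i]         # stochastic gradient from a (possibly stale) read
        x[:] = x - alpha * g                 # asynchronous in-place write

threads = [threading.Thread(target=worker, args=(s, 5000)) for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("distance to x_true:", np.linalg.norm(x - x_true))
```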

Asynchronous Non-Convex Optimization for Separable Problems

This paper considers the distributed optimization of a sum of locally observable, nonconvex functions. The optimization is performed over a multi-agent networked system, and each local function depends only on a subset of the variables. An asynchronous and distributed alternating directions method of multipliers (ADMM) method that allows the nodes to defer or skip the computation and transmissi...
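The sketch below illustrates the "defer or skip" idea in a deliberately simplified setting: standard consensus ADMM with closed-form quadratic local updates, where each node randomly skips a round. It uses a single shared variable rather than the partially coupled variables of the paper, so it is only a rough illustration under assumed constants.

```python
# Consensus ADMM sketch in which nodes may randomly skip an iteration.
import numpy as np

rng = np.random.default_rng(4)
n, d, rho, p_skip = 5, 3, 1.0, 0.3
B = rng.normal(size=(n, d))              # local targets: f_i(x) = 0.5||x - B_i||^2

X = np.zeros((n, d))                     # local copies x_i
U = np.zeros((n, d))                     # scaled dual variables u_i
z = np.zeros(d)                          # global consensus variable

for k in range(200):
    active = rng.random(n) >= p_skip     # nodes that participate this round
    for i in np.where(active)[0]:
        # Closed-form x-update: argmin_x 0.5||x - B_i||^2 + (rho/2)||x - z + u_i||^2
        X[i] = (B[i] + rho * (z - U[i])) / (1 + rho)
    z = (X + U).mean(axis=0)             # consensus (z) update over all copies
    for i in np.where(active)[0]:
        U[i] = U[i] + X[i] - z           # dual ascent on the consensus constraint

print("consensus point:", z)
print("distance to average of targets:", np.linalg.norm(z - B.mean(axis=0)))
```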

Stochastic Successive Convex Approximation for Non-Convex Constrained Stochastic Optimization

This paper proposes a constrained stochastic successive convex approximation (CSSCA) algorithm to find a stationary point for a general non-convex stochastic optimization problem, whose objective and constraint functions are nonconvex and involve expectations over random states. The existing methods for non-convex stochastic optimization, such as the stochastic (average) gradient and stochastic...
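Below is a minimal sketch of the surrogate-and-averaging mechanism behind stochastic successive convex approximation, under strong simplifying assumptions (a smooth non-convex toy objective and only a box constraint rather than the paper's general non-convex constraints): a convex quadratic surrogate built from a recursively averaged stochastic gradient is minimized in closed form, and the iterate moves a small step toward that minimizer.

```python
# Stochastic successive convex approximation sketch with a box constraint.
import numpy as np

rng = np.random.default_rng(3)
d, tau = 2, 1.0                         # dimension, surrogate curvature

def stoch_grad(x):
    """Noisy gradient of the toy objective f(x) = sum cos(x_i) + 0.05 * ||x||^2."""
    return -np.sin(x) + 0.1 * x + 0.05 * rng.normal(size=d)

x = np.array([2.0, -1.0])
f_bar = np.zeros(d)                     # recursive average of stochastic gradients

for t in range(1, 3001):
    rho, gamma = 1.0 / t ** 0.5, 1.0 / t ** 0.6   # averaging / step-size sequences
    f_bar = (1 - rho) * f_bar + rho * stoch_grad(x)
    # Convex surrogate f_bar @ (y - x) + (tau/2)*||y - x||^2, minimized over the box [-4, 4]^d.
    x_hat = np.clip(x - f_bar / tau, -4.0, 4.0)
    x = (1 - gamma) * x + gamma * x_hat           # smoothed move toward the surrogate minimizer

print("stationary point estimate:", x)
```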

ExtraPush for Convex Smooth Decentralized Optimization over Directed Networks

In this note, we extend the existing algorithms Extra [13] and subgradient-push [10] to a new algorithm ExtraPush for convex consensus optimization over a directed network. When the network is stationary, we propose a simplified algorithm called Normalized ExtraPush. These algorithms use a fixed step size like in Extra and accept the column-stochastic mixing matrices like in subgradient-push. W...
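Both subgradient-push and ExtraPush rest on the push-sum mechanism for directed graphs: mix with a column-stochastic matrix, carry an extra scalar weight per node, and normalize by it. The short sketch below shows only that averaging mechanism on an assumed 4-node directed ring, not either algorithm itself.

```python
# Push-sum averaging on a directed graph: the ratio of mixed values to mixed
# weights recovers the exact average even though the mixing matrix is only
# column-stochastic, not doubly stochastic.
import numpy as np

n = 4
values = np.array([1.0, 5.0, -3.0, 7.0])   # each node starts with one number

# Column-stochastic mixing for a directed ring: node j pushes half of its
# mass to itself and half to node (j+1) mod n.
C = np.zeros((n, n))
for j in range(n):
    C[j, j] = 0.5
    C[(j + 1) % n, j] = 0.5

x = values.copy()          # push-sum numerators
w = np.ones(n)             # push-sum weights

for k in range(60):
    x = C @ x
    w = C @ w

print("ratios x/w:", x / w)           # every entry approaches values.mean()
print("true average:", values.mean())
```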

Centralized and Decentralized Asynchronous Optimization of Stochastic Discrete Event Systems

We propose and analyze centralized and decentralized asynchronous control structures for the parametric optimization of stochastic Discrete Event Systems (DES) consisting of K distributed components. We use a stochastic approximation type of optimization scheme driven by gradient estimates of a global performance measure with respect to local control parameters. The estimates are obtained in di...
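The following is a minimal sketch of a stochastic-approximation scheme driven by noisy gradient estimates: K components each own one coordinate of a global cost and, at random times (mimicking asynchrony), update their own coordinate from a noisy two-sided finite-difference estimate. The cost function and step-size choices are illustrative assumptions, not the paper's discrete event system model.

```python
# Stochastic-approximation sketch: asynchronous coordinate updates driven by
# noisy finite-difference estimates of a global performance measure.
import numpy as np

rng = np.random.default_rng(2)
K = 4
theta = np.zeros(K)
theta_star = np.array([1.0, -2.0, 0.5, 3.0])

def noisy_cost(th):
    """Noisy sample of the global performance measure J(theta)."""
    return 0.5 * np.sum((th - theta_star) ** 2) + 0.1 * rng.normal()

for t in range(1, 20001):
    k = rng.integers(K)              # one component wakes up (asynchrony)
    a_t = 1.0 / t ** 0.7             # diminishing step size
    c_t = 1.0 / t ** 0.2             # diminishing finite-difference perturbation
    e = np.zeros(K)
    e[k] = c_t
    # Two-sided finite-difference estimate of dJ/dtheta_k.
    g_k = (noisy_cost(theta + e) - noisy_cost(theta - e)) / (2 * c_t)
    theta[k] -= a_t * g_k            # local stochastic-approximation update

print("estimate:", theta)
print("error:", np.linalg.norm(theta - theta_star))
```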

Journal

Journal: IEEE Transactions on Control of Network Systems

Year: 2023

ISSN: 2325-5870, 2372-2533

DOI: https://doi.org/10.1109/tcns.2023.3242043